# Small Parameter Fine-tuning
## SQFT Phi-3-Mini-4K 60 Base
IntelLabs · Apache-2.0 · 110 downloads · 2 likes

A sparsified model based on microsoft/Phi-3-mini-4k-instruct, achieving 60% sparsity with the Wanda pruning method, without quantization.

Tags: Large Language Model, Transformers, English
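The Wanda method mentioned above prunes weights by a score that combines weight magnitude with input-activation norms, rather than magnitude alone. The following is a minimal NumPy sketch of that scoring rule for a single linear layer, not IntelLabs' actual pruning pipeline; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def wanda_prune(W, X, sparsity=0.6):
    """Sketch of Wanda pruning: score each weight by |W_ij| * ||X_j||_2
    and zero out the lowest-scoring fraction within each output row."""
    # Per-input-feature activation norm over the calibration batch.
    act_norm = np.linalg.norm(X, axis=0)        # shape: (in_features,)
    score = np.abs(W) * act_norm                # broadcasts across rows
    k = int(W.shape[1] * sparsity)              # weights to drop per row
    mask = np.ones_like(W, dtype=bool)
    if k > 0:
        # Indices of the k lowest-scoring weights in each row.
        idx = np.argsort(score, axis=1)[:, :k]
        np.put_along_axis(mask, idx, False, axis=1)
    return W * mask

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 10))                    # dense layer weights
X = rng.normal(size=(32, 10))                   # calibration activations
W_sparse = wanda_prune(W, X, sparsity=0.6)
print((W_sparse == 0).mean())                   # → 0.6
```

Because the score folds in activation norms, weights that are small in magnitude but feed highly active inputs can survive pruning, which is the key difference from plain magnitude pruning.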
## IF PromptMKR Phi
impactframes · 23 downloads · 2 likes

A microsoft/phi-1_5 model fine-tuned with QLoRA on the IFpromptMKR dataset, intended primarily for text generation tasks.

Tags: Large Language Model, Transformers
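QLoRA, used for the fine-tune above, freezes the (quantized) base weights and trains only a low-rank adapter. The following is a conceptual NumPy sketch of the low-rank update itself, not the model's actual training code; the dimensions, rank, and scaling factor are illustrative assumptions, and real QLoRA additionally stores the frozen weights in 4-bit.

```python
import numpy as np

# LoRA idea: instead of updating the full weight matrix W, train two
# small matrices A and B and use the effective weight
#     W_eff = W + (alpha / r) * B @ A
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 16, 32, 4, 8

W = rng.normal(size=(d_out, d_in))     # frozen base weight (4-bit in real QLoRA)
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

W_eff = W + (alpha / r) * (B @ A)      # adapter merged into the base weight
x = rng.normal(size=(d_in,))
print(np.allclose(W_eff @ x, W @ x))   # → True, since B starts at zero
```

Zero-initializing B makes the adapter a no-op at the start of training, so fine-tuning begins exactly from the base model's behavior; only A and B (r·(d_in+d_out) parameters instead of d_out·d_in) receive gradients.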